
    Central Limit Theorems for Wavelet Packet Decompositions of Stationary Random Processes

    This paper provides central limit theorems for the wavelet packet decomposition of stationary band-limited random processes. The asymptotic analysis is performed for the sequences of wavelet packet coefficients returned at the nodes of any given path of the M-band wavelet packet decomposition tree. It is shown that if the input process is centred and strictly stationary, these sequences converge in distribution to white Gaussian processes when the resolution level increases, provided that the decomposition filters satisfy a suitable regularity property. For any given path, the variance of the limit white Gaussian process relates directly to the value of the input process power spectral density at a specific frequency. (Comment: Submitted to the IEEE Transactions on Signal Processing, October 200)
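
    A minimal empirical sketch of the whitening effect described above, assuming PyWavelets is available: a wide-sense stationary AR(1) process is decomposed with a high-order Daubechies filter and the sample autocorrelation of the coefficients at one deep node is inspected. The node path, filter order and AR(1) model are illustrative choices, not the paper's setup.

```python
import numpy as np
import pywt

rng = np.random.default_rng(0)
n = 2**16
# AR(1) process x[t] = 0.9 x[t-1] + w[t]: a simple wide-sense stationary input
x = np.zeros(n)
w = rng.standard_normal(n)
for t in range(1, n):
    x[t] = 0.9 * x[t - 1] + w[t]

level = 6
wp = pywt.WaveletPacket(data=x, wavelet='db20', mode='periodization', maxlevel=level)
path = 'a' * level                     # approximation path; any path can be chosen
c = wp[path].data
c = c - c.mean()

# Sample autocorrelation: close to a Dirac at lag 0 when the resolution level
# and the filter order are large enough.
acf = np.correlate(c, c, mode='full') / (c.var() * len(c))
mid = len(acf) // 2
print("normalized autocorrelation, lags 0..5:", np.round(acf[mid:mid + 6], 3))
print("coefficient variance (relates to the PSD at a path-dependent frequency):",
      round(c.var(), 3))
```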

    Analysis of Sequences of Wavelet Random Processes and Fields

    This synthesis of research work concerns the study of non-stationary random processes when they are projected onto wavelet functional bases. In the cited references, a fine analysis of all the factors contributing to the stationarity and decorrelation of the projection coefficients is proposed, as a function of the statistical dependence structures inherent in the process under consideration. This analysis highlights two outcomes for the same wavelet decomposition: some functions of the wavelet basis break the statistical dependencies intrinsic to the decomposed process, while other functions concentrate these dependencies in specific wavelet subspaces. In practice, identifying the wavelet subspaces associated with these two outcomes simplifies the selection of statistical and/or probabilistic models describing the analysed process. One thus deduces that many stochastic 'texture' fields present in digital images can be described parsimoniously through parametric models associated with their sequences of wavelet coefficients. These descriptions are used in the cited references to propose methods for texture classification, retrieval of specific content in an image database, and change detection in image time series.

    Best Basis for Joint Representation: the Median of Marginal Best Bases for Low Cost Information Exchanges in Distributed Signal Representation

    The paper addresses the selection of the best representations for distributed and/or dependent signals. Given an indexed tree-structured library of bases and a semi-collaborative distribution scheme associated with minimum information exchange (emission and reception of a single index corresponding to a marginal best basis), the paper proposes the median basis computed on a set of best marginal bases for joint representation or fusion of distributed/dependent signals. The paper provides algorithms for computing this median basis with respect to standard tree-structured libraries of bases such as wavelet packet bases or cosine trees. These algorithms are effective when an additive information cost is under consideration. Experimental results on distributed signal compression confirm worthwhile properties of the median of marginal best bases with respect to the ideal best joint basis, the latter being underdetermined in practice, except when a full collaboration scheme is under consideration.
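
    A minimal sketch of one way a per-node majority ("median") aggregation of marginal best bases could look over a binary wavelet packet tree. Each marginal best basis is encoded here as a dict mapping a node path to True if the node is split and False if it is kept as a leaf; this encoding and the majority rule are assumptions for illustration only, whereas the paper defines the median basis with respect to an additive information cost.

```python
from collections import defaultdict

def median_basis(marginal_bases, depth):
    # Count, for every node, how many marginal bases split it.
    votes = defaultdict(int)
    for basis in marginal_bases:
        for node, split in basis.items():
            votes[node] += 1 if split else 0
    k = len(marginal_bases)
    median = {}

    def walk(node, level):
        if level == depth:
            return
        split = votes.get(node, 0) * 2 > k      # majority vote on splitting
        median[node] = split
        if split:                               # recurse only below split nodes
            walk(node + 'a', level + 1)
            walk(node + 'd', level + 1)

    walk('', 0)
    return median

# Three toy marginal best bases on a depth-2 tree ('' = root, 'a'/'d' = children)
b1 = {'': True, 'a': True, 'd': False}
b2 = {'': True, 'a': False, 'd': False}
b3 = {'': True, 'a': True, 'd': True}
print(median_basis([b1, b2, b3], depth=2))   # root and 'a' split, 'd' kept as a leaf
```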

    Stochasticity: A Feature for the Structuring of Large and Heterogeneous Image Databases

    The paper addresses image feature characterization and the structuring of large and heterogeneous image databases through their appearance of stochasticity, or randomness. Measuring stochasticity involves finding suitable representations that can significantly reduce statistical dependencies of any order. Wavelet packet representations provide such a framework for a large class of stochastic processes through an appropriate dictionary of parametric models. From this dictionary and the Kolmogorov stochasticity index, the paper proposes semantic stochasticity templates upon wavelet packet sub-bands in order to provide high-level classification and content-based image retrieval. The approach is shown to be relevant for texture images.
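
    An illustrative sketch, not the paper's exact pipeline: an image is characterized by the Kolmogorov statistic of each wavelet packet sub-band against a fitted Gaussian, yielding a per-sub-band "stochasticity" vector that could serve as a template. The wavelet, level and the Gaussian reference model are assumptions; PyWavelets and SciPy are assumed available.

```python
import numpy as np
import pywt
from scipy import stats

def stochasticity_template(image, wavelet='db4', level=2):
    wp = pywt.WaveletPacket2D(data=image, wavelet=wavelet,
                              mode='periodization', maxlevel=level)
    template = {}
    for node in wp.get_level(level, order='natural'):
        c = np.asarray(node.data).ravel()
        c = (c - c.mean()) / (c.std() + 1e-12)
        # Kolmogorov-Smirnov statistic against the standard normal distribution
        template[node.path] = stats.kstest(c, 'norm').statistic
    return template

rng = np.random.default_rng(1)
image = rng.standard_normal((128, 128))          # stand-in for a texture image
print({k: round(v, 3) for k, v in stochasticity_template(image).items()})
```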

    Control of Discrete Event Systems with Respect to Strict Duration: Supervision of an Industrial Manufacturing Plant

    In this paper, we propose a (max,+)-based method for the supervision of discrete event systems subject to tight time constraints. Systems under consideration are those modelled as timed event graphs and represented with linear (max,+) state equations. The supervision is addressed by looking for solutions of constrained state equations associated with timed event graph models. These constrained state equations are derived by reducing duration constraints to elementary constraints whose contributions are injected into the system's state equations. An example of supervisor synthesis is given for an industrial manufacturing plant subject to a strict temporal constraint: the thermal treatment of rubber parts for the automotive industry. Supervisors are calculated and classified according to their performance, considering their impact on the production throughput.
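
    A minimal sketch of the linear (max,+) state recursion used to model timed event graphs, x(k) = A ⊗ x(k-1) ⊕ B ⊗ u(k), where ⊕ is the maximum and ⊗ is ordinary addition. The matrices below are toy values, not the plant or the constraints of the paper.

```python
import numpy as np

NEG_INF = -np.inf  # neutral element of the (max,+) addition

def maxplus_matvec(A, x):
    """(max,+) matrix-vector product: (A (x) x)_i = max_j (A_ij + x_j)."""
    return np.max(A + x[None, :], axis=1)

A = np.array([[2.0, NEG_INF],
              [5.0, 3.0]])          # holding times between internal transitions
B = np.array([[0.0],
              [NEG_INF]])           # input transition feeds the first state only

x = np.array([0.0, 0.0])            # dates of the first firings
for k, u in enumerate([0.0, 4.0, 8.0], start=1):
    # x(k) = A (x) x(k-1) (+) B (x) u(k): earliest firing dates of the transitions
    x = np.maximum(maxplus_matvec(A, x), maxplus_matvec(B, np.array([u])))
    print(f"x({k}) =", x)
```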

    On the Statistical Decorrelation of the Wavelet Packet Coefficients of a Band-Limited Wide-Sense Stationary Random Process

    This paper is a contribution to the analysis of the statistical correlation of the wavelet packet coefficients resulting from the decomposition of a random process that is stationary in the wide sense and whose power spectral density is bounded with support in [−π, π]. Consider two quadrature mirror filters (QMF) that depend on a parameter r, such that these filters tend almost everywhere to the Shannon QMF as r increases. The parameter r is called the order of the QMF under consideration. The order of the Daubechies filters (resp. the Battle-Lemarié filters) is the number of vanishing moments of the wavelet function (resp. the spline order of the scaling function). Given any decomposition path in the wavelet packet tree, the wavelet packet coefficients are proved to decorrelate for every packet associated with a large enough resolution level, provided that the QMF order is large enough, above a value that depends on this wavelet packet. Another consequence of our derivation is that, when the coefficients associated with a given wavelet packet are approximately decorrelated, the value of the autocorrelation function of these coefficients at lag 0 is close to the value taken by the power spectral density of the decomposed process at a specific point. This specific point depends on the path followed in the wavelet packet tree to reach the wavelet packet under consideration. Some simulations highlight the good quality of the "whitening" effect that can be obtained in practical cases.
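
    A hedged restatement of the limit behaviour described above, as a worked formula; the notation and normalization conventions are assumptions here and may differ from the paper's.

```latex
% Let c_{j,P}[k] denote the wavelet packet coefficients at resolution level j
% along a fixed path P of the tree, obtained with QMF of order r, and let
% \gamma be the power spectral density of the decomposed process. Heuristically,
\[
  \mathbb{E}\bigl[c_{j,P}[k]\, c_{j,P}[k+m]\bigr]
  \;\xrightarrow[\; j,\, r \ \text{large} \;]{}\;
  \gamma(\omega_P)\,\delta[m],
\]
% where \delta is the Kronecker symbol and \omega_P \in [-\pi,\pi] is the
% specific frequency singled out by the path P followed in the tree.
```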

    Wavelet Shrinkage: Unification of Basic Thresholding Functions and Thresholds

    This work addresses the unification of some basic functions and thresholds used in non-parametric estimation of signals by shrinkage in the wavelet domain. The soft and hard thresholding functions are presented as degenerate smooth sigmoid-based shrinkage functions. The shrinkage achieved by this new family of sigmoid-based functions is then shown to be equivalent to a regularisation of wavelet coefficients associated with a class of penalty functions. Some sigmoid-based penalty functions are calculated, and their properties are discussed. The unification also concerns the universal and minimax thresholds used to calibrate standard soft and hard thresholding functions: these thresholds belong to a wide class of thresholds, called the detection thresholds, which depend on two parameters describing the sparsity degree of the wavelet representation of a signal. It is also shown that the non-degenerate sigmoid shrinkage adjusted with the new detection thresholds performs as well as the best up-to-date parametric and computationally expensive methods. This justifies the relevance of sigmoid shrinkage for noise reduction in large databases or large-size images.
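
    A sketch of the basic thresholding functions discussed above, together with one simple sigmoid-based shrinkage form. The sigmoid expression below is an illustrative choice: it tends to hard thresholding as the slope parameter grows, whereas the paper's sigmoid-based family is richer and also recovers soft thresholding as a degenerate case.

```python
import numpy as np

def soft(x, t):
    """Soft thresholding: shrink every coefficient by t, zeroing |x| <= t."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def hard(x, t):
    """Hard thresholding: keep coefficients with |x| > t, zero the others."""
    return x * (np.abs(x) > t)

def sigmoid_shrink(x, t, tau):
    """Smooth sigmoid-based attenuation around the threshold t (illustrative form)."""
    return x / (1.0 + np.exp(-tau * (np.abs(x) - t)))

x = np.linspace(-4, 4, 9)
t = 1.0
print("soft    :", np.round(soft(x, t), 2))
print("hard    :", np.round(hard(x, t), 2))
print("sigmoid :", np.round(sigmoid_shrink(x, t, tau=10.0), 2))  # close to hard
print("sigmoid :", np.round(sigmoid_shrink(x, t, tau=1.0), 2))   # smoother shrinkage
```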

    Detection threshold for non-parametric estimation

    A new threshold is presented for better estimating a signal by sparse transform and soft thresholding. This threshold derives from a non-parametric statistical approach dedicated to the detection of a signal with unknown distribution and unknown probability of presence in independent additive white Gaussian noise. This threshold, called the detection threshold, is particularly appropriate for selecting the few observations, provided by the sparse transform, whose amplitudes are sufficiently large to consider that they contain information about the signal. An upper bound for the risk of the soft thresholding estimation is computed when the detection threshold is used. For a wide class of signals, it is shown that, when the number of observations is large, this upper bound is between about two and four times smaller than the standard upper bounds given for the universal and minimax thresholds. Many real-world signals belong to this class, as illustrated by several experimental results.
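
    A baseline illustration only: soft thresholding of a sparse signal in additive white Gaussian noise with the standard universal threshold sigma*sqrt(2 ln N), one of the two references against which the detection threshold is compared. The closed form of the detection threshold itself is not reproduced here; the signal model below is a toy assumption.

```python
import numpy as np

rng = np.random.default_rng(2)
n, sigma = 4096, 1.0
theta = np.zeros(n)
support = rng.choice(n, size=64, replace=False)
theta[support] = rng.normal(0.0, 6.0, size=64)      # sparse signal, arbitrary amplitudes

y = theta + sigma * rng.standard_normal(n)           # noisy observations
t_universal = sigma * np.sqrt(2.0 * np.log(n))       # universal threshold

# Soft thresholding with the universal threshold
est = np.sign(y) * np.maximum(np.abs(y) - t_universal, 0.0)
print("universal threshold:", round(t_universal, 3))
print("risk (mean squared error):", round(np.mean((est - theta) ** 2), 4))
```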

    Smooth Adaptation by Sigmoid Shrinkage

    This work addresses the properties of a sub-class of sigmoid-based shrinkage functions: the non-zero-forcing smooth sigmoid-based shrinkage functions, or SigShrink functions. It provides a SURE optimization for the parameters of the SigShrink functions, performed on an unbiased estimate of the risk obtained when using the functions of this sub-class. The SURE SigShrink performance measurements are compared with those of the SURELET (SURE linear expansion of thresholds) parameterization, and SURE SigShrink is shown to perform well in comparison. The relevance of SigShrink lies in the physical meaning and flexibility of its parameters. SigShrink functions perform weak attenuation of data with large amplitudes and stronger attenuation of data with small amplitudes, the shrinkage process introducing little variability among data with close amplitudes. In the wavelet domain, SigShrink is particularly suitable for reducing noise without significantly impacting the signal to be recovered. A remarkable property of this class of sigmoid-based functions is the invertibility of its elements, which makes it possible to smoothly tune contrast (enhancement or reduction).
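
    A sketch of SURE-based selection of two parameters (threshold t, slope tau) of a sigmoid shrinkage function under an i.i.d. Gaussian noise model with known sigma. The sigmoid form, the toy signal and the coarse grid search are illustrative assumptions, not the optimization procedure of the paper.

```python
import numpy as np

def sigmoid_shrink(y, t, tau):
    s = 1.0 / (1.0 + np.exp(-tau * (np.abs(y) - t)))
    return y * s

def sure(y, t, tau, sigma):
    """Stein unbiased risk estimate of E||delta(y) - theta||^2 for the sigmoid shrinker."""
    s = 1.0 / (1.0 + np.exp(-tau * (np.abs(y) - t)))
    delta = y * s
    ddelta = s + tau * np.abs(y) * s * (1.0 - s)     # analytic derivative d delta / d y
    n = y.size
    return np.sum((delta - y) ** 2) + 2.0 * sigma**2 * np.sum(ddelta) - n * sigma**2

rng = np.random.default_rng(3)
n, sigma = 2048, 1.0
theta = np.zeros(n); theta[::32] = 5.0               # toy sparse "wavelet coefficients"
y = theta + sigma * rng.standard_normal(n)

grid = [(t, tau) for t in (1.0, 2.0, 3.0) for tau in (1.0, 3.0, 10.0)]
best = min(grid, key=lambda p: sure(y, p[0], p[1], sigma))
print("SURE-selected (t, tau):", best)
print("true risk at that choice:",
      round(np.mean((sigmoid_shrink(y, *best) - theta) ** 2), 4))
```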

    Wavelet Packets of fractional Brownian motion: Asymptotic Analysis and Spectrum Estimation

    This work provides asymptotic properties of the autocorrelation functions of the wavelet packet coefficients of a fractional Brownian motion. It also discusses the speed of convergence to the limit autocorrelation function when the input random process is either a fractional Brownian motion or a wide-sense stationary second-order random process. The analysis concerns some families of wavelet paraunitary filters that converge almost everywhere to the Shannon paraunitary filters. From this analysis, we derive wavelet packet based spectrum estimation for fractional Brownian motions and wide-sense stationary random processes. Experimental tests show good results for estimating the spectrum of 1/f processes.
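
    An illustrative spectral-exponent estimation for a 1/f-type process via the log-variance of wavelet detail coefficients across scales, a standard wavelet regression estimator used here as a stand-in for the paper's wavelet packet based estimator. Ordinary Brownian motion (Hurst exponent H = 0.5) serves as the test signal, so the fitted slope should be close to 2H + 1 = 2; the wavelet and depth are arbitrary choices.

```python
import numpy as np
import pywt

rng = np.random.default_rng(4)
n, levels = 2**16, 8
x = np.cumsum(rng.standard_normal(n))                # Brownian motion, H = 0.5

coeffs = pywt.wavedec(x, 'db8', level=levels)         # [cA_J, cD_J, ..., cD_1]
details = coeffs[1:][::-1]                            # reorder as cD_1 ... cD_J

js = np.arange(1, levels + 1)
log_var = np.array([np.log2(np.var(d)) for d in details])
slope, intercept = np.polyfit(js, log_var, 1)         # expect slope ~ 2H + 1
print("fitted slope:", round(slope, 2), "-> estimated H:", round((slope - 1) / 2, 2))
```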